
    Is the even distribution of insecticide-treated cattle essential for tsetse control? Modelling the impact of baits in heterogeneous environments

    Background: Eliminating Rhodesian sleeping sickness, the zoonotic form of Human African Trypanosomiasis, can be achieved only through interventions against the vectors, species of tsetse (Glossina). The use of insecticide-treated cattle is the most cost-effective method of controlling tsetse, but its impact might be compromised by the patchy distribution of livestock. A deterministic simulation model was used to analyse the effects of spatial heterogeneities in habitat and baits (insecticide-treated cattle and targets) on the distribution and abundance of tsetse. Methodology/Principal Findings: The simulated area comprised an operational block extending 32 km from an area of good habitat from which tsetse might invade. Within the operational block, habitat comprised good areas mixed with poor ones where survival probabilities and population densities were lower. In good habitat, the natural daily mortalities of adults averaged 6.14% for males and 3.07% for females; the population grew 8.46-fold in a year following a 90% reduction in densities of adults and pupae, but expired when the population density of males was reduced to <0.1/km²; daily movement of adults averaged 249 m for males and 367 m for females. Baits were placed throughout the operational area, or patchily to simulate uneven distributions of cattle and targets. Gaps of 2–3 km between baits were inconsequential provided the average imposed mortality per km² across the entire operational area was maintained. Leaving gaps 5–7 km wide inside an area where baits killed 10% per day delayed effective control by 4–11 years. Corrective measures that put a few baits within the gaps were more effective than deploying extra baits on the edges. Conclusions/Significance: The uneven distribution of cattle within settled areas is unlikely to compromise the impact of insecticide-treated cattle on tsetse. However, where cattle-free areas are >3 km wide, insecticide-treated targets should be deployed to compensate for the lack of cattle.
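The population parameters in this abstract can be illustrated with a minimal deterministic daily-step sketch. This is not the authors' actual model: the female natural mortality and the 10%/day bait kill rate are taken from the abstract, while the recruitment rate is an assumed value chosen so that the unbaited population grows roughly 8.5-fold per year, close to the reported figure.

```python
# Minimal daily-step sketch of a tsetse female population under
# bait-imposed mortality. Illustrative only; not the published model.

FEMALE_NATURAL_MORTALITY = 0.0307  # mean daily mortality of adult females (from abstract)
BAIT_MORTALITY = 0.10              # daily kill rate where baits are deployed (from abstract)
DAILY_RECRUITMENT = 0.0366         # assumed: gives ~8.5-fold growth per year without baits

def simulate(days, n0=1000.0, bait=True):
    """Return female population density after `days` daily steps."""
    n = n0
    for _ in range(days):
        survival = 1.0 - FEMALE_NATURAL_MORTALITY
        if bait:
            survival *= 1.0 - BAIT_MORTALITY
        n = n * survival + n * DAILY_RECRUITMENT
    return n
```

Under these assumptions the unbaited population grows several-fold in a year, while a uniform 10%/day bait mortality drives it rapidly toward extinction, which is the qualitative behaviour the abstract describes.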

    Position Sensing from Charge Dispersion in Micro-Pattern Gas Detectors with a Resistive Anode

    Micro-pattern gas detectors, such as the Gas Electron Multiplier (GEM) and the Micromegas, need narrow, high-density anode readout elements to achieve good spatial resolution. A high-density anode readout would require an unmanageable number of electronics channels for certain potential micro-detector applications such as the Time Projection Chamber. We describe a new technique to achieve good spatial resolution without increasing the electronics channel count in a modified micro-detector outfitted with a high-surface-resistivity anode readout structure. The concept and preliminary measurements of spatial resolution from charge dispersion in a modified GEM detector with a resistive anode are presented.

    Spatial resolution of a GEM readout TPC using the charge dispersion signal

    A large volume Time Projection Chamber (TPC) is being considered as the central charged particle tracker for the detector for the proposed International Linear Collider (ILC). To meet the ILC-TPC spatial resolution challenge of ~100 microns with a manageable number of readout pads and channels of electronics, Micro Pattern Gas Detectors (MPGDs) are being developed which could use pads comparable in width to those of the proportional-wire/cathode-pad TPC. We have built a prototype GEM readout TPC with 2 mm x 6 mm pads using the new concept of charge dispersion in MPGDs with a resistive anode. The dependence of transverse resolution on the drift distance has been measured for small-angle tracks in cosmic ray tests without a magnetic field for Ar/CO2 (90:10). The GEM-TPC resolution with charge dispersion readout is significantly better than previous measurements carried out with conventional direct charge readout techniques.
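The charge dispersion idea underlying both detector abstracts above is that the resistive anode spreads each avalanche's charge over several adjacent pads, so the hit position can be reconstructed from an amplitude-weighted centroid rather than from a single narrow pad. A toy sketch of that reconstruction step, in which the real RC dispersion kernel is replaced by a Gaussian and all numbers are illustrative:

```python
import math

def pad_amplitudes(track_x, pad_pitch=2.0, n_pads=7, sigma=1.5):
    """Toy model of dispersed charge sampled on a row of pads (mm).
    A Gaussian stands in for the detector's RC dispersion kernel."""
    centers = [(i - n_pads // 2) * pad_pitch for i in range(n_pads)]
    amps = [math.exp(-0.5 * ((c - track_x) / sigma) ** 2) for c in centers]
    return centers, amps

def centroid(centers, amps):
    """Amplitude-weighted pad centroid: the reconstructed position."""
    return sum(c * a for c, a in zip(centers, amps)) / sum(amps)
```

With charge shared across several 2 mm pads, the centroid recovers a position well inside a single pad width, which is the mechanism that lets wide pads deliver fine resolution.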

    The Potential Role of Intraoperative Ultrasonography in the Surgical Treatment of Hilar Cholangiocarcinoma

    The role of intraoperative ultrasonography (IOU) in the surgical treatment of hilar cholangiocarcinoma was explored in twenty-two patients, 17 males and 5 females. The mean age was 55 years (range 36-78 years). Preoperative imaging studies included abdominal ultrasonography and/or CT scan, and visceral angiography. Operations performed were segment III bypass in 18 patients, local resection of tumour in 2, and resection of tumour en bloc with left hepatectomy in 2. Interpretation of IOU in terms of vascular involvement by the tumour (as compared to angiography or operative findings) was correct in 21 patients: no vascular invasion in 20 and portal vein invasion in the remainder. One false negative result occurred in a patient whose IOU failed to show right hepatic artery encasement by the tumour. When compared to postoperative cholangiography or the surgical specimen, IOU correctly demonstrated the location and extent of the tumours in all but one patient, who had incomplete tumour resection. IOU was also helpful in locating the segment III duct for biliary bypass. The mean time used for IOU was 15.1 min (range 10-20 min), and there was no procedure-related complication. When supplemented with operative exploration, IOU seems to be very useful in the assessment of the resectability of hilar cholangiocarcinoma.

    Use of Current 2010 Forest Disturbance Monitoring Products for the Conterminous United States in Aiding a National Forest Threat Early Warning System

    This presentation discusses contributions of near real time (NRT) MODIS forest disturbance detection products for the conterminous United States to an emerging national forest threat early warning system (EWS). The latter is being developed by the USDA Forest Service's Eastern and Western Environmental Threat Centers with help from NASA Stennis Space Center and the Oak Ridge National Laboratory. Building off work done in 2009, this national and regional forest disturbance detection and viewing capability of the EWS employs NRT MODIS NDVI data from the USGS eMODIS group and historical NDVI data from standard MOD13 products. Disturbance detection products are being computed for 24-day composites that are refreshed every 8 days. Products for 2010 include 42 dates of the 24-day composites. For each compositing date, we computed % change in forest maximum NDVI products for 2010 with respect to each of three historical baselines: 2009, 2007-2009, and 2003-2009. The three baselines enable one to view potential current, recent, and longer-term forest disturbances. A rainbow color table was applied to each forest change product so that potential disturbances (NDVI drops) were identified in hot color tones and growth (NDVI gains) in cool color tones. Example products were provided to end-users responsible for forest health monitoring at the Federal and State levels. Large patches of potential forest disturbances were validated based on comparisons with available reference data, including Landsat and field survey data. Products were posted on two internet mapping systems for US Forest Service internal and collaborator use. MODIS forest disturbance detection products were computed and posted for use in as little as 1 day after the last input date of the compositing period. Such products were useful for aiding aerial disturbance detection surveys and for assessing disturbance persistence on both inter- and intra-annual scales. Multiple 2010 forest disturbance events were detected across the nation, including damage from ice storms, tornadoes, caterpillars, bark beetles, and wildfires. This effort enabled improved NRT forest disturbance monitoring capabilities for this nation-wide forest threat EWS.
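The core per-pixel computation described above, percent change in composite-period maximum NDVI relative to a historical baseline, can be sketched as follows. The function name and the sample values are illustrative, not part of the operational product:

```python
def pct_change_max_ndvi(current_max, baseline_max):
    """Percent change of the current composite's maximum NDVI relative
    to a historical-baseline maximum. Negative values (NDVI drops)
    flag potential forest disturbance; positive values indicate growth."""
    return 100.0 * (current_max - baseline_max) / baseline_max

# Comparing one pixel's 2010 composite maximum (0.60, assumed) against
# the three baseline maxima used in the product (values assumed):
baselines = {"2009": 0.80, "2007-2009": 0.78, "2003-2009": 0.76}
changes = {k: pct_change_max_ndvi(0.60, v) for k, v in baselines.items()}
```

Evaluating the same pixel against current, recent, and long-term baselines is what separates a fresh disturbance from one that has persisted across years.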

    Review of broad-scale drought monitoring of forests: Toward an integrated data mining approach

    Efforts to monitor the broad-scale impacts of drought on forests often come up short. Drought is a direct stressor of forests as well as a driver of secondary disturbance agents, making a full accounting of drought impacts challenging. General impacts can be inferred from moisture deficits quantified using precipitation and temperature measurements. However, derived meteorological indices may not meaningfully capture drought impacts because drought responses can differ substantially among species, sites and regions. Meteorology-based approaches also require the characterization of current moisture conditions relative to some specified time and place, but defining baseline conditions over large, ecologically diverse regions can be as difficult as quantifying the moisture deficit itself. In contrast, remote sensing approaches attempt to observe immediate, secondary, and longer-term changes in vegetation response, yet they too are no panacea. Remote sensing methods integrate responses across entire mixed-vegetation pixels and rarely distinguish the effects of drought on a single species, nor can they disentangle drought effects from those caused by various other disturbance agents. Establishment of suitable baselines from remote sensing may be even more challenging than with meteorological data. Here we review broad-scale drought monitoring methods, and suggest that an integrated data-mining approach may hold the most promise for enhancing our ability to resolve drought impacts on forests. A big-data approach that integrates meteorological and remotely sensed data streams, together with other data sets such as vegetation type, wildfire occurrence and pest activity, can clarify direct drought effects while filtering indirect drought effects and consequences. 
This strategy leverages the strengths of meteorology-based and remote sensing approaches with the aid of ancillary data, such that they complement each other and lead toward a better understanding of drought impacts.
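As a deliberately simplified illustration of the proposed integration, a per-pixel rule might require agreement between a meteorological anomaly and a remotely sensed vegetation anomaly, while using ancillary records to filter declines explained by other disturbance agents. All thresholds and variable names here are assumptions, not a published algorithm:

```python
def drought_flag(precip_z, ndvi_z, other_agent_present):
    """Toy fusion rule for one pixel:
    - precip_z: standardized precipitation anomaly (meteorological signal)
    - ndvi_z: standardized NDVI anomaly (remote sensing signal)
    - other_agent_present: ancillary record of fire/pest/other disturbance
    Flag a drought impact only when both signals are strongly negative
    and no other agent explains the vegetation decline."""
    return precip_z < -1.0 and ndvi_z < -1.0 and not other_agent_present
```

Requiring both data streams to agree is what filters the false positives that either approach produces alone, which is the complementarity the review argues for.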

    Antiretroviral therapy initiated soon after HIV diagnosis as standard care: potential to save lives?

    In 2008, an estimated 33.4 million people were infected with human immunodeficiency virus (HIV) and ~4 million people were receiving antiretroviral therapy (ART). However, in 2007, an estimated 6.7 million people were in need of ART under the current World Health Organization guidelines, and 2.7 million more people became infected with HIV. Most of those not currently eligible for ART will become eligible within the next decade, making the current treatment strategy unsustainable. The development of cheaper, less toxic, and more potent antiretrovirals over the past decade has made it possible to consider novel strategies of arresting the HIV/AIDS epidemic. Evidence is growing that ART can be used to prevent HIV transmission and that earlier initiation of treatment is beneficial for those infected with HIV. A mathematical model predicts that by testing whole communities annually and treating all who are infected immediately, up to 7.2 million AIDS-related deaths could be prevented in the next 40 years, long-term funding required to fight the HIV epidemic could be reduced, and, most importantly, control of the HIV/AIDS epidemic could be regained within 1–2 years of full-scale implementation of the strategy. We discuss the development of the concept of ART for the prevention of HIV transmission and the modeled impact that a test-and-treat strategy could have on the HIV epidemic, and consequently argue that a field trial should be carried out to confirm model parameters, highlight any practical problems, and test the model's predictions.
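The qualitative logic of the test-and-treat argument can be sketched with a toy annual-step compartmental model: ART reduces infectiousness, AIDS mortality falls mainly on the untreated, and annual testing moves most untreated infections onto treatment. All parameter values below are illustrative assumptions and do not reproduce the published model or its 7.2-million figure:

```python
def simulate_epidemic(years, test_and_treat, beta=0.3, beta_art=0.03,
                      mu_aids=0.1, coverage=0.9):
    """Toy annual-step model (all parameters assumed):
    beta / beta_art: per-year transmission from untreated / treated infections
    mu_aids: annual AIDS mortality among the untreated
    coverage: fraction of untreated found and treated each year.
    Returns cumulative AIDS deaths over `years`."""
    s, i_u, i_t, deaths = 0.85, 0.15, 0.0, 0.0   # susceptible, untreated, treated
    for _ in range(years):
        n = s + i_u + i_t
        new_inf = s * (beta * i_u + beta_art * i_t) / n
        d = mu_aids * i_u
        s -= new_inf
        i_u += new_inf - d
        deaths += d
        if test_and_treat:
            moved = coverage * i_u   # annual testing shifts untreated onto ART
            i_u -= moved
            i_t += moved
    return deaths
```

Under these assumptions the test-and-treat arm accumulates far fewer deaths over 40 years than the no-intervention baseline, which is the direction of effect the abstract's model predicts.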